On Tsallis Relative Entropy Rate of Hidden Markov Models
Authors
Abstract
Similar Resources
Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
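For context, a standard way to formalize these quantities is the Kullback–Leibler relative entropy between the length-n marginals of the two processes, with the rate obtained as the normalized limit; in generic notation (not necessarily the paper's own),

D_n(P \,\|\, Q) = \sum_{z_1^n} P(Z_1^n = z_1^n) \,\log \frac{P(Z_1^n = z_1^n)}{Q(Z_1^n = z_1^n)},
\qquad
h(P \,\|\, Q) = \lim_{n \to \infty} \frac{1}{n}\, D_n(P \,\|\, Q),

whenever the limit exists.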
Taylor Expansion for the Entropy Rate of Hidden Markov Chains
We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, obtaining an exact expression for the entropy rate remains an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
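Such an estimate can also be checked numerically: by the Shannon–McMillan–Breiman theorem, -(1/n) log p(y_1, ..., y_n) along a long simulated observation sequence converges to the entropy rate, and the likelihood is computable with a normalized forward recursion. A minimal Python sketch under assumed parameters (the two-state transition matrix and the crossover probability eps below are illustrative, not taken from the paper):

import numpy as np

def entropy_rate_estimate(P, eps, n=200_000, seed=0):
    # Monte Carlo estimate (in nats) of the entropy rate of a binary hidden
    # Markov process: a 2-state Markov chain with transition matrix P observed
    # through a binary symmetric channel with crossover probability eps.
    rng = np.random.default_rng(seed)
    B = np.array([[1.0 - eps, eps],        # emission matrix: B[x, y] = P(Y = y | X = x)
                  [eps, 1.0 - eps]])
    # stationary distribution of the input chain (left eigenvector for eigenvalue 1)
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()
    # simulate the hidden chain and its noisy observation
    x = np.empty(n, dtype=int)
    x[0] = rng.choice(2, p=pi)
    for t in range(1, n):
        x[t] = rng.choice(2, p=P[x[t - 1]])
    y = np.where(rng.random(n) < eps, 1 - x, x)   # flip each bit with probability eps
    # normalized forward recursion accumulates log p(y_1 ... y_n)
    alpha = pi * B[:, y[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, n):
        alpha = (alpha @ P) * B[:, y[t]]
        c = alpha.sum()
        loglik += np.log(c)
        alpha /= c
    return -loglik / n   # Shannon-McMillan-Breiman: converges to the entropy rate

# illustrative parameters only: symmetric input chain, channel crossover 0.1
P = np.array([[0.7, 0.3],
              [0.3, 0.7]])
print(entropy_rate_estimate(P, eps=0.1))

The result is in nats; dividing by log 2 converts it to bits.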
The Relative Entropy Rate For Two Hidden Markov Processes
The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, representation using Lyapunov exponents, and Taylor expansion for the relative entropy rate of two discr...
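In practice this quantity can be approximated by Monte Carlo: simulate a long observation sequence from the first HMP and average the log-likelihood ratio of the two models, each likelihood computed with a forward recursion, since (1/n) E_P[log p(Y_1^n) - log q(Y_1^n)] converges to the relative entropy rate. A small Python sketch with illustrative parameters (none of the matrices below come from the paper):

import numpy as np

def forward_loglik(y, P, B, pi):
    # log-likelihood log p(y_1 ... y_n) of an HMP with transition matrix P,
    # emission matrix B[state, symbol] and initial distribution pi
    alpha = pi * B[:, y[0]]
    ll = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for t in range(1, len(y)):
        alpha = (alpha @ P) * B[:, y[t]]
        c = alpha.sum()
        ll += np.log(c)
        alpha = alpha / c
    return ll

def simulate_hmp(P, B, pi, n, rng):
    # draw an observation sequence of length n from the HMP (P, B, pi)
    x = rng.choice(len(pi), p=pi)
    y = np.empty(n, dtype=int)
    for t in range(n):
        y[t] = rng.choice(B.shape[1], p=B[x])
        x = rng.choice(len(pi), p=P[x])
    return y

def relative_entropy_rate(Pp, Bp, pip, Pq, Bq, piq, n=200_000, seed=0):
    # Monte Carlo estimate of (1/n) E_P[log p(Y_1^n) - log q(Y_1^n)],
    # which converges to the relative entropy rate D(P || Q) (in nats)
    rng = np.random.default_rng(seed)
    y = simulate_hmp(Pp, Bp, pip, n, rng)
    return (forward_loglik(y, Pp, Bp, pip) - forward_loglik(y, Pq, Bq, piq)) / n

# illustrative parameters only: two binary HMPs sharing the hidden chain
# but differing in observation noise
Pp = np.array([[0.8, 0.2], [0.2, 0.8]]); Bp = np.array([[0.9, 0.1], [0.1, 0.9]])
Pq = np.array([[0.8, 0.2], [0.2, 0.8]]); Bq = np.array([[0.7, 0.3], [0.3, 0.7]])
pi = np.array([0.5, 0.5])
print(relative_entropy_rate(Pp, Bp, pi, Pq, Bq, pi))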
A note on inequalities for Tsallis relative operator entropy
In this short note, we present some inequalities for relative operator entropy which are generalizations of some results obtained by Zou [Operator inequalities associated with Tsallis relative operator entropy, Math. Inequal. Appl. 18 (2015), no. 2, 401–406]. Meanwhile, we also show some new lower and upper bounds for relative operator entropy and Tsallis relative o...
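For reference, the Tsallis relative operator entropy discussed here is usually defined, for positive invertible operators A and B and a parameter \lambda \in (0,1], via the weighted operator geometric mean (standard notation, not necessarily the note's own):

T_\lambda(A \mid B) = \frac{A \,\sharp_\lambda\, B - A}{\lambda},
\qquad
A \,\sharp_\lambda\, B = A^{1/2}\big(A^{-1/2} B A^{-1/2}\big)^{\lambda} A^{1/2},

and as \lambda \to 0^+ it converges to the relative operator entropy
S(A \mid B) = A^{1/2} \log\big(A^{-1/2} B A^{-1/2}\big) A^{1/2}.
Inequalities of the type studied in such notes refine the basic bounds S(A \mid B) \le T_\lambda(A \mid B) \le B - A.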
Relative entropy between Markov transition rate matrices
We derive the relative entropy between two Markov transition rate matrices from sample path considerations. This relative entropy is interpreted as a "level 2.5" large deviations action functional. That is, the level two large deviations action functional for empirical distributions of continuous-time Markov chains can be derived from the relative entropy using the contraction mapping principle...
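One commonly quoted expression for this quantity, written here from the general theory of Markov jump processes rather than copied from the paper, is the stationary expectation of the per-rate divergence:

h(Q \,\|\, \tilde{Q}) \;=\; \sum_{i} \pi(i) \sum_{j \neq i} \Big[\, q(i,j)\,\log\frac{q(i,j)}{\tilde{q}(i,j)} \;-\; q(i,j) \;+\; \tilde{q}(i,j) \,\Big],

where \pi is the stationary distribution of the rate matrix Q and q(i,j), \tilde{q}(i,j) are the off-diagonal transition rates of Q and \tilde{Q}.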
Journal
Journal title: Journal of Statistical Research of Iran
Year: 2018
ISSN: 1735-1294
DOI: 10.29252/jsri.15.1.83